New Interior-Point Approach for One- and Two-Class Linear Support Vector Machines Using Multiple Variable Splitting

Authors

Abstract

Multiple variable splitting is a general technique for decomposing problems by using copies of variables and additional linking constraints that equate their values. The resulting large optimization problem can be solved with a specialized interior-point method that exploits the problem structure and computes the Newton direction by combining direct and iterative solvers (i.e., Cholesky factorizations and preconditioned conjugate gradients for the linear systems related to, respectively, the subproblems and the new linking constraints). The present work applies this approach to solving real-world binary classification and novelty (or outlier) detection problems by means of, respectively, two-class and one-class linear support vector machines (SVMs). Unlike previous interior-point approaches for SVMs, which were practical only for low-dimensional points, this proposal can also deal with high-dimensional data. It is compared with state-of-the-art solvers for SVMs that are based on either interior-point algorithms (such as SVM-OOPS) or specific algorithms developed by the machine learning community (such as LIBSVM and LIBLINEAR). The computational results show that, for two-class SVMs, the proposal is competitive not only against previous interior-point methods (and is much more efficient than they are on high-dimensional data) but also against LIBSVM, whereas LIBLINEAR generally outperformed the proposal. For one-class SVMs, the proposal consistently outperformed all other approaches, in terms of solution time and quality.
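To make the linear-algebra pattern described above concrete, the following minimal sketch (Python with NumPy/SciPy) illustrates the general idea on a toy equality-constrained quadratic problem: the variable is split into several copies with block-separable Hessians, linking constraints equate the copies, and one Newton-type direction is computed by factorizing each block with a Cholesky decomposition and solving the Schur-complement system of the linking constraints with conjugate gradients. All names, sizes, and the unpreconditioned CG choice are illustrative assumptions; this is not the specialized interior-point solver developed in the paper.

# Minimal sketch (not the paper's solver): compute a Newton-type direction for a
# toy problem with k copies of an n-dimensional variable linked by equality
# constraints, using Cholesky factorizations for the block subproblems and
# conjugate gradients for the system coming from the linking constraints.
import numpy as np
from scipy.linalg import cho_factor, cho_solve
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
k, n = 3, 50                                   # k variable copies, each of dimension n

# Block Hessians H_i (symmetric positive definite) and gradients g_i, one per copy.
H = [np.eye(n) + M @ M.T for M in (rng.standard_normal((n, n)) for _ in range(k))]
g = np.concatenate([rng.standard_normal(n) for _ in range(k)])

# Linking constraints equating consecutive copies, x_i - x_{i+1} = 0, stacked as A x = 0.
A = np.zeros(((k - 1) * n, k * n))
for i in range(k - 1):
    A[i*n:(i+1)*n, i*n:(i+1)*n] = np.eye(n)
    A[i*n:(i+1)*n, (i+1)*n:(i+2)*n] = -np.eye(n)

# KKT system  [H  A^T; A  0] [dx; dl] = [-g; 0].  Eliminate dx with the block
# Cholesky factors, then solve the Schur complement (A H^{-1} A^T) dl = -A H^{-1} g by CG.
chol = [cho_factor(Hi) for Hi in H]

def apply_Hinv(v):
    # Apply H^{-1} block by block using the precomputed Cholesky factors.
    return np.concatenate([cho_solve(chol[i], v[i*n:(i+1)*n]) for i in range(k)])

def schur_matvec(lam):
    # Matrix-vector product with A H^{-1} A^T without ever forming the matrix.
    return A @ apply_Hinv(A.T @ lam)

S = LinearOperator(((k - 1) * n, (k - 1) * n), matvec=schur_matvec, dtype=np.float64)
dl, info = cg(S, -A @ apply_Hinv(g))
assert info == 0                                # CG converged

dx = apply_Hinv(-g - A.T @ dl)                  # recover the primal direction blockwise

# Sanity check: the direction satisfies the KKT conditions up to the CG tolerance.
Hdx = np.concatenate([H[i] @ dx[i*n:(i+1)*n] for i in range(k)])
print("stationarity residual:", np.linalg.norm(Hdx + A.T @ dl + g))
print("linking residual:     ", np.linalg.norm(A @ dx))

In the approach the abstract describes, the blocks come from the problem decomposition, the conjugate-gradient solve is preconditioned, and such a direction is computed at every interior-point iteration; the sketch only mirrors the division of labor between the direct and the iterative solver.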


Similar Articles

Interior-Point Methods for Massive Support Vector Machines

We investigate the use of interior-point methods for solving quadratic programming problems with a small number of linear constraints, where the quadratic term consists of a low-rank update to a positive semidefinite matrix. Several formulations of the support vector machine fit into this category. An interesting feature of these particular problems is the volume of data, which can lead to quad...
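For reference, the standard dual of the linear two-class SVM with hinge loss (not necessarily the exact formulation used in that paper) shows the structure being described: a quadratic program with a single linear equality constraint plus simple bounds, whose Hessian is a low-rank matrix determined by the data matrix \(X\) and labels \(y\):

\[
\min_{\alpha}\ \tfrac{1}{2}\,\alpha^{\top} Q\,\alpha - e^{\top}\alpha
\quad \text{s.t.}\quad y^{\top}\alpha = 0,\ \ 0 \le \alpha \le C e,
\qquad Q = V V^{\top},\ \ V = \operatorname{diag}(y)\,X,
\]

so the rank of \(Q\) is at most the number of features, which is the low-rank structure interior-point solvers can exploit when the feature dimension is modest.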


Fuzzy one-class support vector machines

In one-class classification, the problem is to distinguish one class of data from the rest of the feature space. It is important in many applications where one of the classes is characterized well, while no measurements are available for the other class. Schölkopf et al. first introduced a method of adapting the support vector machine (SVM) methodology to the one-class classification problem, c...


Face Authentication Using One-Class Support Vector Machines

This paper proposes a new method for personal identity verification based on the analysis of face images using One-Class Support Vector Machines. This is a recently introduced kernel method that builds a unary classifier trained only on positive examples, avoiding the delicate choice of the impostor set typical of standard binary Support Vector Machines. The features of this classifier...


Hierarchical Clustering Using One-Class Support Vector Machines

This paper presents a novel hierarchical clustering method using support vector machines. A common approach for hierarchical clustering is to use distance for the task. However, different choices for computing inter-cluster distances often lead to fairly distinct clustering outcomes, causing interpretation difficulties in practice. In this paper, we propose to use a one-class support vector mac...


Soft clustering using weighted one-class support vector machines




Journal

Journal title: Journal of Optimization Theory and Applications

Year: 2022

ISSN: 0022-3239, 1573-2878

DOI: https://doi.org/10.1007/s10957-022-02103-1